IO-VNBD: Inertial and Odometry benchmark dataset for ground vehicle positioning

Authors

Abstract

Low-cost inertial navigation sensors (INS) can be exploited for a reliable tracking solution for autonomous vehicles. However, position errors grow exponentially over time due to noise in the measurements. Several deep learning techniques have been investigated to mitigate these errors and obtain a better navigation solution [1-10]. However, these studies involved the use of different datasets that were not made publicly available. The lack of a robust benchmark dataset has thus hindered the advancement, comparison and adoption of research on vehicle positioning based on inertial navigation. In order to facilitate benchmarking and the fast development and evaluation of positioning algorithms, we therefore present a first-of-its-kind, large-scale, information-rich, odometry-focused public dataset called IO-VNBD (Inertial Odometry Vehicle Navigation Benchmark Dataset). The dataset was recorded using a research vehicle equipped with ego-motion sensors on roads in the United Kingdom, Nigeria and France. The sensors include a GPS receiver, inertial sensors and wheel-speed sensors, amongst other sensors found in the car, as well as the inertial sensors and GPS receiver of an Android smartphone sampling at 10 Hz. A diverse set of driving scenarios and vehicle dynamics is captured, such as traffic, roundabouts and hard braking, on various road types (country roads, motorways, etc.) and with varying driving patterns. The dataset consists of a total driving time of about 40 hours over 1,300 km of vehicle-extracted data and about 58 hours over 4,400 km of smartphone data. We hope that this dataset will prove valuable in furthering research on the correlation between a vehicle's displacement and its inertial and odometry measurements, as well as other related studies.
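To make the drift problem above concrete, the sketch below double-integrates noisy planar accelerations into a position track. The column names (time_s, ax, ay) and the 10 Hz layout are illustrative assumptions, not IO-VNBD's actual file schema.

```python
# Minimal dead-reckoning sketch: double-integrating noisy accelerations.
# Column names ("time_s", "ax", "ay") are hypothetical placeholders standing
# in for navigation-frame accelerations; they are not the dataset's real schema.
import numpy as np
import pandas as pd

def dead_reckon(df: pd.DataFrame) -> np.ndarray:
    """Integrate planar accelerations twice to get a position track."""
    t = df["time_s"].to_numpy()
    acc = df[["ax", "ay"]].to_numpy()            # m/s^2
    dt = np.diff(t, prepend=t[0])                # sample intervals (~0.1 s at 10 Hz)
    vel = np.cumsum(acc * dt[:, None], axis=0)   # first integration: velocity
    return np.cumsum(vel * dt[:, None], axis=0)  # second integration: position

# A small constant accelerometer bias already produces large position drift.
n = 10 * 60 * 10                                 # 10 minutes of samples at 10 Hz
demo = pd.DataFrame({
    "time_s": np.arange(n) / 10.0,
    "ax": np.random.normal(0.0, 0.05, n) + 0.01, # white noise plus 0.01 m/s^2 bias
    "ay": np.random.normal(0.0, 0.05, n),
})
print("position drift after 10 min: %.0f m" % np.linalg.norm(dead_reckon(demo)[-1]))
```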


Similar articles

Visual odometry for ground vehicle applications

We present a system that estimates the motion of a stereo head or a single moving camera based on video input. The system operates in real-time with low delay and the motion estimates are used for navigational purposes. The front end of the system is a feature tracker. Point features are matched between pairs of frames and linked into image trajectories at video rate. Robust estimates of the ca...
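As a rough illustration of the frame-to-frame feature matching such a front end performs, the sketch below matches ORB features between two consecutive frames with OpenCV; the file names and camera intrinsics are placeholder assumptions, and the paper's own tracker and robust estimator are not reproduced.

```python
# Sketch of frame-to-frame feature matching (not the paper's actual tracker).
# File names and camera intrinsics below are placeholder assumptions.
import cv2

prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(prev, None)
kp2, des2 = orb.detectAndCompute(curr, None)

# Brute-force Hamming matching, keeping the strongest correspondences.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

# Robust relative-motion estimate from the matched points (RANSAC).
pts1 = cv2.KeyPoint_convert([kp1[m.queryIdx] for m in matches])
pts2 = cv2.KeyPoint_convert([kp2[m.trainIdx] for m in matches])
E, inliers = cv2.findEssentialMat(pts1, pts2, focal=700.0, pp=(320.0, 240.0),
                                  method=cv2.RANSAC, prob=0.999, threshold=1.0)
# cv2.recoverPose(E, pts1, pts2, ...) would then yield the relative rotation
# and (scale-free) translation between the two frames.
```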


A Benchmark Comparison of Monocular Visual-Inertial Odometry Algorithms for Flying Robots

Flying robots require a combination of accuracy and low latency in their state estimation in order to achieve stable and robust flight. However, due to the power and payload constraints of aerial platforms, state estimation algorithms must provide these qualities under the computational constraints of embedded hardware. Cameras and inertial measurement units (IMUs) satisfy these power and paylo...


Inertial Odometry on Handheld Smartphones

Building a complete inertial navigation system using the limited quality data provided by current smartphones has been regarded challenging, if not impossible. We present a probabilistic approach for orientation and use-case free inertial odometry, which is based on double-integrating rotated accelerations. Our approach uses a probabilistic approach in fusing the noisy sensor data and learning ...
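A minimal sketch of the "double-integrating rotated accelerations" idea follows, assuming orientation quaternions and body-frame accelerations are already available from a smartphone log; the paper's probabilistic fusion and online bias learning are not shown.

```python
# Core idea only: rotate body-frame accelerations to the world frame with the
# orientation estimate, subtract gravity, then integrate twice. Inputs are
# assumed to come from a smartphone log; no sensor fusion or bias learning here.
import numpy as np
from scipy.spatial.transform import Rotation as R

GRAVITY = np.array([0.0, 0.0, 9.81])  # m/s^2, world z-up convention assumed

def integrate_track(quats: np.ndarray, acc: np.ndarray, dt: float) -> np.ndarray:
    """quats: (N, 4) xyzw orientations; acc: (N, 3) body-frame accelerations."""
    world_acc = R.from_quat(quats).apply(acc) - GRAVITY  # rotate, remove gravity
    vel = np.cumsum(world_acc * dt, axis=0)              # first integration
    return np.cumsum(vel * dt, axis=0)                   # second integration
```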


Inertial Sensors, GPS, and Odometry

This chapter examines how certain properties of the world can be exploited in order for a robot or other device to develop a model of its own motion or pose (position and orientation) relative to an external frame of reference. Although this is a critical problem for many autonomous robotic systems, the problem of establishing and maintaining an orientation or position estimate of a mobile agen...


Correlation-based visual odometry for ground vehicles

Reliable motion estimation is a key component for autonomous vehicles. We present a visual odometry method for ground vehicles using template matching. The method uses a downward facing camera perpendicular to the ground and estimates the motion of the vehicle by analyzing the image shift from frame to frame. Specifically, an image region (template) is selected and using correlation we find the...
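The correlation step described above can be sketched as follows: a template cut from the previous downward-facing frame is located in the current frame by normalised cross-correlation, and the peak offset gives the pixel shift. The file names, template size and any pixel-to-metre scale are assumptions, not the paper's configuration.

```python
# Template-matching sketch for a downward-facing camera: the shift of a
# central template between consecutive frames approximates the vehicle's
# image-plane motion. File names and template size are placeholder assumptions.
import cv2
import numpy as np

prev = cv2.imread("ground_prev.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("ground_curr.png", cv2.IMREAD_GRAYSCALE)

h, w = prev.shape
template = prev[h // 4: 3 * h // 4, w // 4: 3 * w // 4]  # central image region

# Normalised cross-correlation over the current frame; the peak is the best match.
response = cv2.matchTemplate(curr, template, cv2.TM_CCOEFF_NORMED)
_, _, _, best = cv2.minMaxLoc(response)

# Pixel shift relative to the template's original position; a known camera
# height (metres per pixel) would convert this to metric vehicle motion.
shift = np.array(best) - np.array([w // 4, h // 4])
print("frame-to-frame image shift (px):", shift)
```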



Journal

Journal title: Data in Brief

Year: 2021

ISSN: 2352-3409

DOI: https://doi.org/10.1016/j.dib.2021.106885